Attention-based Neural Cellular Automata

Neural Information Processing Systems

Recent extensions of Cellular Automata (CA) have incorporated key ideas from modern deep learning, dramatically extending their capabilities and catalyzing a new family of Neural Cellular Automata (NCA) techniques. Inspired by Transformer-based architectures, our work presents a new class of attention-based NCAs formed using a spatially localized, yet globally organized, self-attention scheme. We introduce an instance of this class named Vision Transformer Cellular Automata (ViTCA). When comparing across architectures configured to similar parameter complexity, ViTCA architectures yield superior performance across all benchmarks and for nearly every evaluation metric. We present an ablation study on various architectural configurations of ViTCA, an analysis of its effect on cell states, and an investigation on its inductive biases.
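To make the core idea concrete, the following is a minimal NumPy sketch of a spatially localized self-attention update over a grid of cell states: every cell attends only to its 3x3 neighborhood, while the same projection weights are shared globally across the grid. All names, the neighborhood radius, and the wrap-around padding are illustrative assumptions, not ViTCA's actual architecture.

```python
import numpy as np

def local_self_attention_update(cells, Wq, Wk, Wv, radius=1):
    """One NCA-style step: each cell attends over its (2r+1)^2 neighborhood.

    cells: (H, W, C) grid of cell state vectors.
    Wq, Wk, Wv: (C, C) query/key/value projections (hypothetical parameters,
    shared by every cell -- this sharing is the "globally organized" part).
    """
    H, W, C = cells.shape
    # Wrap-around padding so boundary cells also see a full neighborhood.
    padded = np.pad(cells, ((radius, radius), (radius, radius), (0, 0)),
                    mode="wrap")
    out = np.empty_like(cells)
    for i in range(H):
        for j in range(W):
            # Flatten the local neighborhood to (k, C) key/value candidates.
            patch = padded[i:i + 2 * radius + 1,
                           j:j + 2 * radius + 1].reshape(-1, C)
            q = cells[i, j] @ Wq                 # (C,) query from center cell
            k = patch @ Wk                       # (k, C) keys from neighbors
            v = patch @ Wv                       # (k, C) values from neighbors
            scores = k @ q / np.sqrt(C)          # scaled dot-product scores
            weights = np.exp(scores - scores.max())
            weights /= weights.sum()             # softmax over the neighborhood
            out[i, j] = weights @ v              # attention-weighted new state
    return out

rng = np.random.default_rng(0)
grid = rng.standard_normal((8, 8, 4))
Wq, Wk, Wv = (0.1 * rng.standard_normal((4, 4)) for _ in range(3))
updated = local_self_attention_update(grid, Wq, Wk, Wv)
print(updated.shape)
```

Because the update rule is identical for every cell, information can only propagate one neighborhood radius per step, so global coordination emerges from iterating the update rather than from any single attention pass.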